Neighborhood Components Analysis for Reward-Based Dimensionality Reduction

Author

  • Nathan Sprague
Abstract

There has been a great deal of research that attempts to explain the structure of biological receptive fields in terms of various methods for adapting basis vectors based on the statistical structure of visual input. These include principal components analysis (Hancock et al., 1992), independent components analysis (Bell & Sejnowski, 1997), non-negative matrix factorization (Lee & Seung, 1999), and predictive coding (Rao & Ballard, 1999), among others. Typically, such approaches are based purely on the structure of the visual input; there is no consideration of the role that visual information plays in the goal-directed behavior of an organism. The motivation for the current work is to explore mechanisms of basis vector adaptation that are explicitly driven by the behavioral demands of a situated agent.


Related articles

2D Dimensionality Reduction Methods without Loss

In this paper, several two-dimensional extensions of principal component analysis (PCA) and linear discriminant analysis (LDA) techniques have been applied in a lossless dimensionality reduction framework for face recognition. In this framework, the benefits of dimensionality reduction were used to improve the performance of the predictive model, which was a support vector machine (...

Full text

Dimensionality reduction for speech recognition using neighborhood components analysis

Previous work has considered methods for learning projections of high-dimensional acoustic representations to lower dimensional spaces. In this paper we apply the neighborhood components analysis (NCA) [2] method to acoustic modeling in a speech recognizer. NCA learns a projection of acoustic vectors that optimizes a criterion that is closely related to the classification accuracy of a nearest-...

Full text
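As a concrete illustration of the NCA criterion mentioned above, the following sketch evaluates the expected leave-one-out nearest-neighbor accuracy of a fixed linear projection, following Goldberger et al.'s NCA formulation; the function name and toy data are illustrative, not from the paper:

```python
import numpy as np

def nca_objective(A, X, y):
    """Expected leave-one-out 1-NN accuracy of the projection x -> Ax.

    Point i selects neighbor j with probability p_ij proportional to
    exp(-||Ax_i - Ax_j||^2); the objective is the average probability
    mass each point assigns to same-class neighbors.
    """
    Z = X @ A.T                                  # project to low-D space
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                 # a point never selects itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)            # softmax over neighbors
    same_class = y[:, None] == y[None, :]
    return (P * same_class).sum() / len(y)       # in (0, 1]

rng = np.random.default_rng(0)
# toy data: two tight 3-D clusters, projected down to 2-D
X = np.vstack([rng.normal(0, 0.1, (20, 3)), rng.normal(3, 0.1, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
A = np.eye(2, 3)                                 # drop the third coordinate
print(nca_objective(A, X, y))                    # near 1: classes well separated
```

NCA proper optimizes this objective over A by gradient ascent; the sketch only evaluates it, which is enough to see why the criterion tracks nearest-neighbor classification accuracy.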

Spatial Distance Preservation Based Methods for Non-Linear Dimensionality Reduction

The preservation of the pairwise distances measured in a data set ensures that the low-dimensional embedding inherits the main geometric properties of the data, such as the local neighborhood relationships. In this paper, distance-preserving techniques, namely Sammon's nonlinear mapping (Sammon's NLM) and Curvilinear Component Analysis (CCA), have been discussed and compared for non-linear dimensional...

Full text
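Sammon's stress, the distance-preservation criterion that Sammon's NLM minimizes, can be sketched as follows; the function names and toy data are illustrative:

```python
import numpy as np

def _pairwise(X):
    """Upper-triangle vector of Euclidean pairwise distances."""
    diff = X[:, None, :] - X[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    i, j = np.triu_indices(len(X), k=1)
    return d[i, j]

def sammon_stress(X_high, X_low):
    """Sammon's stress E = (1 / sum_ij D_ij) * sum_ij (D_ij - d_ij)^2 / D_ij,
    where D_ij are distances in the original space and d_ij in the
    embedding; 0 means all pairwise distances are preserved exactly.
    """
    D, d = _pairwise(X_high), _pairwise(X_low)
    return ((D - d) ** 2 / D).sum() / D.sum()

rng = np.random.default_rng(1)
planar = rng.normal(size=(15, 2))
lifted = np.hstack([planar, np.zeros((15, 1))])  # same geometry lifted to 3-D
print(sammon_stress(lifted, planar))             # ~0: distances preserved
noisy = planar + rng.normal(0, 0.5, planar.shape)
print(sammon_stress(lifted, noisy))              # > 0: distances distorted
```

The normalization by D_ij weights small original distances more heavily, which is what makes Sammon's NLM emphasize local neighborhood structure.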

Fast Discriminative Component Analysis for Comparing Examples

Two recent methods, Neighborhood Components Analysis (NCA) and Informative Discriminant Analysis (IDA), search for a class-discriminative subspace or discriminative components of data, which is equivalent to learning distance metrics invariant to changes perpendicular to the subspace. Constraining metrics to a subspace is useful for regularizing the metrics, and for dimensionality reduction. We intro...

Full text

Image feature optimization based on nonlinear dimensionality reduction

Image feature optimization is an important means to deal with high-dimensional image data in image semantic understanding and its applications. We formulate image feature optimization as the establishment of a mapping between high- and low-dimensional spaces via a five-tuple model. Nonlinear dimensionality reduction based on manifold learning provides a feasible way for solving such a problem. We ...

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2007